Tags: deep learning*


  1. BEAL is a deep active learning method that uses Bayesian deep learning with dropout to infer the model’s posterior predictive distribution, and introduces an expected confidence-based acquisition function to select uncertain samples. Experiments show that BEAL outperforms other active learning methods, requiring fewer labeled samples for efficient training. (A minimal dropout-based acquisition sketch appears after this list.)
  2. Pete Warden shares his experience and knowledge about the memory layout of the Raspberry Pi Pico board, specifically the RP2040 microcontroller. He encountered baffling bugs while updating TensorFlow Lite Micro and traced them to poor understanding of the memory layout. The article provides detailed insights into the physical and RAM layouts, stack behavior, and potential pitfalls.
  3. A detailed overview of the architecture, Python implementation, and future of autoencoders, focusing on their use in feature extraction and dimension reduction in unsupervised learning. (A toy autoencoder sketch appears after this list.)
  4. Researchers have mapped the complete neural connectome of a fruit fly, detailing all 139,255 nerve cells and their connections. This advance offers insights into how the brain processes information.
  5. Exploring popular reinforcement learning environments in a beginner-friendly way, focusing on the Q-learning method to solve the 'Frozen Lake' environment. (A short tabular Q-learning example appears after this list.)
  6. This article introduces the Bayesian Neural Field (BayesNF), a method combining deep neural networks with hierarchical Bayesian inference for scalable and flexible analysis of spatiotemporal data, such as environmental monitoring and cloud demand forecasting.
  7. "We present a systematic review of some of the popular machine learning based email spam filtering approaches."

    "Our review covers survey of the important concepts, attempts, efficiency, and the research trend in spam filtering."
  8. Learn how to set up the Raspberry Pi AI Kit with the new Raspberry Pi 5. The kit allows you to explore machine learning and AI concepts using Python and TensorFlow.
  9. Generate realistic sequential data with this easy-to-train model. This article explores using Variational Autoencoders (VAEs) to model and generate time series data. It details the specific architecture choices, such as 1D convolutional layers and a seasonally dependent prior, used to capture the periodic and sequential patterns in temperature data. (A compact convolutional VAE sketch appears after this list.)
  10. This paper presents a method to accelerate the grokking phenomenon, where a model's generalization improves with further training after an initial overfitting stage. The authors propose a simple algorithmic modification to existing optimizers that filters out the fast-varying components of the gradients and amplifies the slow-varying components, thereby accelerating the grokking effect. (A simplified gradient-filter sketch appears after this list.)
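
A minimal sketch in the spirit of item 1 (BEAL): Monte Carlo dropout approximates the posterior predictive distribution, and the pool samples with the lowest mean max-softmax confidence are queried for labeling. This is not the paper's exact acquisition function; the network, pool, and query size below are illustrative assumptions.

```python
# Dropout-based uncertainty sampling sketch (not BEAL's exact acquisition).
# "Expected confidence" is approximated as the mean max-softmax probability
# over T stochastic forward passes with dropout kept active (MC dropout).
import torch
import torch.nn as nn

class MCDropoutNet(nn.Module):
    def __init__(self, in_dim=20, hidden=64, n_classes=3, p=0.5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Dropout(p),
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(p),
            nn.Linear(hidden, n_classes),
        )

    def forward(self, x):
        return self.net(x)

@torch.no_grad()
def expected_confidence(model, x, n_samples=20):
    """Mean max-softmax probability over n_samples dropout forward passes."""
    model.train()  # keep dropout active at inference time
    probs = torch.stack(
        [torch.softmax(model(x), dim=-1) for _ in range(n_samples)]
    )                                      # (T, N, C)
    mean_probs = probs.mean(dim=0)         # approximate posterior predictive
    return mean_probs.max(dim=-1).values   # (N,) confidence per sample

model = MCDropoutNet()
pool = torch.randn(1000, 20)               # hypothetical unlabeled pool
conf = expected_confidence(model, pool)
query_idx = torch.topk(-conf, k=16).indices  # lowest expected confidence
print(query_idx.tolist())
```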
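
A toy autoencoder for item 3, assuming a fully connected encoder/decoder and a 2-dimensional bottleneck used as the reduced feature representation; the article's actual architecture may differ.

```python
# Minimal autoencoder for dimension reduction: the encoder output serves as a
# learned low-dimensional feature representation.
import torch
import torch.nn as nn

class Autoencoder(nn.Module):
    def __init__(self, in_dim=784, latent_dim=2):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(),
            nn.Linear(128, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(),
            nn.Linear(128, in_dim),
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

model = Autoencoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

x = torch.rand(256, 784)              # stand-in for real flattened inputs
for _ in range(100):                  # toy training loop
    recon, _ = model(x)
    loss = loss_fn(recon, x)
    opt.zero_grad()
    loss.backward()
    opt.step()

with torch.no_grad():
    features = model.encoder(x)       # 2-D codes for downstream use
print(features.shape)                 # torch.Size([256, 2])
```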
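
A short tabular Q-learning example for item 5, run on Gymnasium's FrozenLake-v1; the hyperparameters and episode count are illustrative, not necessarily the article's values.

```python
# Tabular Q-learning on FrozenLake-v1 with epsilon-greedy exploration.
import numpy as np
import gymnasium as gym

env = gym.make("FrozenLake-v1", is_slippery=True)
n_states, n_actions = env.observation_space.n, env.action_space.n
Q = np.zeros((n_states, n_actions))

alpha, gamma = 0.1, 0.99          # learning rate, discount factor
epsilon, eps_decay = 1.0, 0.9995  # exploration schedule

for episode in range(10_000):
    state, _ = env.reset()
    done = False
    while not done:
        if np.random.rand() < epsilon:
            action = env.action_space.sample()      # explore
        else:
            action = int(np.argmax(Q[state]))       # exploit
        next_state, reward, terminated, truncated, _ = env.step(action)
        done = terminated or truncated
        # Q-learning update: move Q(s, a) toward r + gamma * max_a' Q(s', a')
        Q[state, action] += alpha * (
            reward + gamma * np.max(Q[next_state]) - Q[state, action]
        )
        state = next_state
    epsilon = max(0.05, epsilon * eps_decay)

print("Greedy policy:", np.argmax(Q, axis=1).reshape(4, 4))
```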
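
A compact 1D-convolutional VAE for item 9, operating on fixed-length time-series windows. The seasonally dependent prior described in the article is omitted; this sketch falls back to a standard N(0, I) prior, and the window length and channel sizes are assumptions.

```python
# 1-D convolutional VAE sketch for time-series windows of length 96.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConvVAE(nn.Module):
    def __init__(self, seq_len=96, latent_dim=8):
        super().__init__()
        self.seq_len = seq_len
        self.encoder = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.Flatten(),
        )
        enc_out = 32 * (seq_len // 4)
        self.fc_mu = nn.Linear(enc_out, latent_dim)
        self.fc_logvar = nn.Linear(enc_out, latent_dim)
        self.fc_dec = nn.Linear(latent_dim, enc_out)
        self.decoder = nn.Sequential(
            nn.ConvTranspose1d(32, 16, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose1d(16, 1, kernel_size=4, stride=2, padding=1),
        )

    def forward(self, x):                       # x: (B, 1, seq_len)
        h = self.encoder(x)
        mu, logvar = self.fc_mu(h), self.fc_logvar(h)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterize
        h_dec = self.fc_dec(z).view(-1, 32, self.seq_len // 4)
        return self.decoder(h_dec), mu, logvar

def vae_loss(recon, x, mu, logvar):
    recon_loss = F.mse_loss(recon, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon_loss + kl

model = ConvVAE()
x = torch.sin(torch.linspace(0, 8 * torch.pi, 96)).repeat(32, 1, 1)  # toy periodic data
recon, mu, logvar = model(x)
print(vae_loss(recon, x, mu, logvar).item())
```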
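
A simplified rendering of the idea in item 10: keep an exponential moving average of each parameter's gradient (the slow-varying component) and add an amplified copy of it back to the raw gradient before the optimizer step. This is a sketch, not the authors' reference implementation; `alpha` and `lam` are illustrative.

```python
# EMA low-pass filter on gradients, applied between backward() and step().
import torch
import torch.nn as nn

def gradfilter_ema(model, ema_grads, alpha=0.98, lam=2.0):
    """Amplify the slow-varying (EMA) component of each gradient in place."""
    for name, p in model.named_parameters():
        if p.grad is None:
            continue
        if name not in ema_grads:
            ema_grads[name] = p.grad.detach().clone()
        else:
            ema_grads[name].mul_(alpha).add_(p.grad.detach(), alpha=1 - alpha)
        p.grad.add_(ema_grads[name], alpha=lam)
    return ema_grads

# Toy usage inside a standard training loop.
model = nn.Linear(10, 1)
opt = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=1e-2)
ema_grads = {}
x, y = torch.randn(64, 10), torch.randn(64, 1)
for step in range(100):
    loss = nn.functional.mse_loss(model(x), y)
    opt.zero_grad()
    loss.backward()
    ema_grads = gradfilter_ema(model, ema_grads, alpha=0.98, lam=2.0)
    opt.step()
```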
